https://w.atwiki.jp/api_programming/pages/194.html
OAuth 2.0 for Client-side Web Applications

Contents: Prerequisites | Create authorization credentials | Identify access scopes | Obtaining OAuth 2.0 access tokens (Step 1: Configure the client object | Step 2: Redirect to Google's OAuth 2.0 server | Sample redirect to Google's authorization server)

This document explains how to implement OAuth 2.0 authorization to access Google APIs from a JavaScript web application. OAuth 2.0 allows users to share specific data with an application while keeping their usernames, passwords, and other information private. For example, an application can use OAuth 2.0 to obtain permission from users to store files in their Google Drives.

This OAuth 2.0 flow is called the implicit grant flow. It is designed for applications that access APIs only while the user is present at the application. These applications are not able to store confidential information.

In this flow, your app opens a Google URL that uses query parameters to identify your app and the type of API access that the app requires. You can open the URL in the current browser window or a popup. The user can authenticate with Google and grant the requested permissions. Google then redirects the user back to your app. The redirect includes an access token, which your app verifies and then uses to make API requests.

Note: Given the security implications of getting the implementation correct, we strongly encourage you to use OAuth 2.0 libraries when interacting with Google's OAuth 2.0 endpoints. It is a best practice to use well-debugged code provided by others, and it will help you protect yourself and your users. See the JS Client Library tabs in this document for examples that show how to authorize users with the Google APIs Client Library for JavaScript.

Prerequisites

Enable APIs for your project

Any application that calls Google APIs needs to enable those APIs in the API Console.
To enable the appropriate APIs for your project:

1. Open the Library page in the API Console.
2. Select the project associated with your application. Create a project if you do not have one already.
3. Use the Library page to find each API that your application will use. Click on each API and enable it for your project.

Create authorization credentials

Any application that uses OAuth 2.0 to access Google APIs must have authorization credentials that identify the application to Google's OAuth 2.0 server. The following steps explain how to create credentials for your project. Your applications can then use the credentials to access APIs that you have enabled for that project.

1. Open the Credentials page in the API Console.
2. Click Create credentials > OAuth client ID.
3. Complete the form. Set the application type to Web application. Applications that use JavaScript to make authorized Google API requests must specify authorized JavaScript origins. The origins identify the domains from which your application can send API requests.

Identify access scopes

Scopes enable your application to only request access to the resources that it needs while also enabling users to control the amount of access that they grant to your application. Thus, there may be an inverse relationship between the number of scopes requested and the likelihood of obtaining user consent.

Before you start implementing OAuth 2.0 authorization, we recommend that you identify the scopes that your app will need permission to access. The OAuth 2.0 API Scopes document contains a full list of scopes that you might use to access Google APIs.

Obtaining OAuth 2.0 access tokens

The following steps show how your application interacts with Google's OAuth 2.0 server to obtain a user's consent to perform an API request on the user's behalf. Your application must have that consent before it can execute a Google API request that requires user authorization.
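As the scopes section above notes, an authorization request names its scopes as a single space-delimited list. A minimal sketch of assembling that list (the `scopeParam` helper and the second scope in the list are our own illustrative choices, not taken from this document):

```javascript
// Sketch: OAuth 2.0 scope values are passed as one space-delimited string.
// The scopeParam helper and this SCOPES list are illustrative only.
var SCOPES = [
  'https://www.googleapis.com/auth/drive.metadata.readonly',
  'https://www.googleapis.com/auth/userinfo.profile'
];

function scopeParam(scopes) {
  // Join the individual scope URLs with single spaces.
  return scopes.join(' ');
}

console.log(scopeParam(SCOPES));
```

Requesting only the scopes a feature actually needs keeps the consent screen short, which, as noted above, makes user consent more likely.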
Step 1: Configure the client object

If you are using the Google APIs client library for JavaScript to handle the OAuth 2.0 flow, your first step is to configure the gapi.auth2 and gapi.client objects. These objects enable your application to obtain user authorization and to make authorized API requests.

The client object identifies the scopes that your application is requesting permission to access. These values inform the consent screen that Google displays to the user. The Choosing access scopes section provides information about how to determine which scopes your application should request permission to access.

If you are directly accessing the OAuth 2.0 endpoints, you can proceed to the next step.

Step 2: Redirect to Google's OAuth 2.0 server

To request permission to access a user's data, redirect the user to Google's OAuth 2.0 server.

Generate a URL to request access from Google's OAuth 2.0 endpoint at https://accounts.google.com/o/oauth2/v2/auth. This endpoint is accessible over HTTPS; plain HTTP connections are refused. The Google authorization server supports the following query string parameters for web server applications.

Parameters:

client_id (Required) The client ID for your application. You can find this value in the API Console.

redirect_uri (Required) Determines where the API server redirects the user after the user completes the authorization flow. The value must exactly match one of the redirect URIs you configured in the API Console, including the http or https scheme, case, and trailing slash ('/').

response_type (Required) JavaScript applications must set this value to token. This instructs the Google authorization server to return the access token as a name=value pair in the hash (#) fragment of the URI to which it redirects the user.

scope (Required) A space-delimited list of scopes that identify the resources that your application could access on the user's behalf. These values inform the consent screen that Google displays to the user. Scopes enable your application to only request access to the resources that it needs while also enabling users to control the amount of access that they grant to your application.
Thus, there is an inverse relationship between the number of scopes requested and the likelihood of obtaining user consent. The OAuth 2.0 API Scopes document provides a full list of scopes that you might use to access Google APIs. We recommend that your application request access to authorization scopes in context whenever possible. By requesting access to user data in context, via incremental authorization, you help users to more easily understand why your application needs the access it is requesting.

state (Recommended) Specifies any string value that your application uses to maintain state between your authorization request and the authorization server's response. The server returns the exact value that you send as a name=value pair in the hash (#) fragment of the redirect_uri after the user consents to or denies your application's access request. You can use this parameter for several purposes, such as directing the user to the correct resource in your application, sending nonces, and mitigating cross-site request forgery. Since your redirect_uri can be guessed, using a state value can increase your assurance that an incoming connection is the result of an authentication request. If you generate a random string or encode the hash of a cookie or another value that captures the client's state, you can validate the response to additionally ensure that the request and response originated in the same browser, providing protection against attacks such as cross-site request forgery. See the OpenID Connect documentation for an example of how to create and confirm a state token.

include_granted_scopes (Optional) Enables applications to use incremental authorization to request access to additional scopes in context. If you set this parameter's value to true and the authorization request is granted, then the new access token will also cover any scopes to which the user previously granted the application access. See the incremental authorization section for examples.
login_hint (Optional) If your application knows which user is trying to authenticate, it can use this parameter to provide a hint to the Google Authentication Server. The server uses the hint to simplify the login flow either by prefilling the email field in the sign-in form or by selecting the appropriate multi-login session. Set the parameter value to an email address or sub identifier.

prompt (Optional) A space-delimited, case-sensitive list of prompts to present the user. If you don't specify this parameter, the user will be prompted only the first time your app requests access. Possible values are:

none: Do not display any authentication or consent screens. Must not be specified with other values.
consent: Prompt the user for consent.
select_account: Prompt the user to select an account.

Sample redirect to Google's authorization server

An example URL is shown below, with line breaks and spaces for readability.

https://accounts.google.com/o/oauth2/v2/auth?
 scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive.metadata.readonly&
 include_granted_scopes=true&
 state=state_parameter_passthrough_value&
 redirect_uri=http%3A%2F%2Foauth2.example.com%2Fcallback&
 response_type=token&
 client_id=client_id

After you create the request URL, redirect the user to it.

JavaScript sample code

The following JavaScript snippet shows how to initiate the authorization flow in JavaScript without using the Google APIs Client Library for JavaScript. Since this OAuth 2.0 endpoint does not support Cross-origin resource sharing (CORS), the snippet creates a form that opens the request to that endpoint.

/*
 * Create form to request access token from Google's OAuth 2.0 server.
 */
function oauthSignIn() {
  // Google's OAuth 2.0 endpoint for requesting an access token
  var oauth2Endpoint = 'https://accounts.google.com/o/oauth2/v2/auth';

  // Create form element to submit parameters to OAuth 2.0 endpoint.
  var form = document.createElement('form');
  form.setAttribute('method', 'GET'); // Send as a GET request.
  form.setAttribute('action', oauth2Endpoint);

  // Parameters to pass to OAuth 2.0 endpoint.
  var params = {'client_id': 'YOUR_CLIENT_ID',
                'redirect_uri': 'YOUR_REDIRECT_URI',
                'response_type': 'token',
                'scope': 'https://www.googleapis.com/auth/drive.metadata.readonly',
                'include_granted_scopes': 'true',
                'state': 'pass-through value'};

  // Add form parameters as hidden input values.
  for (var p in params) {
    var input = document.createElement('input');
    input.setAttribute('type', 'hidden');
    input.setAttribute('name', p);
    input.setAttribute('value', params[p]);
    form.appendChild(input);
  }

  // Add form to page and submit it to open the OAuth 2.0 endpoint.
  document.body.appendChild(form);
  form.submit();
}

Step 3: Google prompts user for consent

In this step, the user decides whether to grant your application the requested access. At this stage, Google displays a consent window that shows the name of your application and the Google API services that it is requesting permission to access with the user's authorization credentials. The user can then consent or refuse to grant access to your application.

Your application doesn't need to do anything at this stage as it waits for the response from Google's OAuth 2.0 server indicating whether the access was granted. That response is explained in the following step.

Step 4: Handle the OAuth 2.0 server response

The OAuth 2.0 server sends a response to the redirect_uri specified in your access token request. If the user approves the request, then the response contains an access token. If the user does not approve the request, the response contains an error message.
The access token or error message is returned on the hash fragment of the redirect URI, as shown below.

An access token response:

https://oauth2.example.com/callback#access_token=4/P7q7W91&token_type=Bearer&expires_in=3600

In addition to the access_token parameter, the fragment also contains the token_type parameter, which is always set to Bearer, and the expires_in parameter, which specifies the lifetime of the token, in seconds. If the state parameter was specified in the access token request, its value is also included in the response.

An error response:

https://oauth2.example.com/callback#error=access_denied

Note: Your application should ignore any additional, unrecognized fields included in the response.

Sample OAuth 2.0 server response

You can test this flow by clicking on the following sample URL, which requests read-only access to view metadata for files in your Google Drive:

https://accounts.google.com/o/oauth2/v2/auth?
 scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive.metadata.readonly&
 include_granted_scopes=true&
 state=state_parameter_passthrough_value&
 redirect_uri=http%3A%2F%2Foauth2.example.com%2Fcallback&
 response_type=token&
 client_id=client_id

After completing the OAuth 2.0 flow, you will be redirected to http://localhost/oauth2callback. That URL will yield a 404 NOT FOUND error unless your local machine happens to serve a file at that address. The next step provides more detail about the information returned in the URI when the user is redirected back to your application.

The code snippet in step 5 shows how to parse the OAuth 2.0 server response and then validate the access token.

Step 5: Validate the user's token

If the user has granted access to your application, you must explicitly validate the token returned in the hash fragment of the redirect_uri. By validating, or verifying, the token, you ensure that your application is not vulnerable to the confused deputy problem.
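Before the token can be validated, the application first has to pull the name=value pairs out of the hash fragment. The complete example later in this document does this with a regular expression; the stand-alone sketch below (the parseFragment name is ours) shows the same idea as a pure function:

```javascript
// Sketch: split a hash fragment (the part of location.hash after '#')
// into an object of decoded name=value pairs. parseFragment is an
// illustrative name, not part of any Google library.
function parseFragment(fragment) {
  var params = {};
  fragment.split('&').forEach(function (pair) {
    var eq = pair.indexOf('=');
    if (eq > 0) {
      params[decodeURIComponent(pair.slice(0, eq))] =
        decodeURIComponent(pair.slice(eq + 1));
    }
  });
  return params;
}

// In a browser you would call it as:
//   var params = parseFragment(location.hash.substring(1));
```

After parsing, params['access_token'] holds the token to validate in the next step, while params['error'] is set instead when the user denied access.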
To validate the token, send a request to https://www.googleapis.com/oauth2/v3/tokeninfo and set the token as the access_token parameter's value. The following URL shows a sample request:

https://www.googleapis.com/oauth2/v3/tokeninfo?access_token=access_token

Google's authorization server responds to the request with a JSON object that either describes the token or contains an error message. If the token is valid, the JSON object includes the properties in the following table.

Fields:

aud: The application that is the intended user of the access token. Important: Before using the token, you need to verify that this field's value exactly matches your Client ID in the Google API Console. This verification ensures that your application is not vulnerable to the confused deputy problem.

expires_in: The number of seconds left before the token becomes invalid.

scope: A space-delimited list of scopes that the user granted access to. The list should match the scopes specified in your authorization request in step 1.

user_id: This value lets you correlate profile information from multiple Google APIs. It is only present in the response if you included the profile scope in your request in step 1. The field value is an immutable identifier for the logged-in user that can be used to create and manage user sessions in your application. The identifier is the same regardless of which client ID is used to retrieve it. This enables multiple applications in the same organization to correlate profile information.

A sample response is shown below:

{
  "aud": "8819981768.apps.googleusercontent.com",
  "user_id": "123456789",
  "scope": "https://www.googleapis.com/auth/drive.metadata.readonly",
  "expires_in": 436
}

If the token has expired, been tampered with, or had its permissions revoked, Google's authorization server returns an error message in the JSON object. The error surfaces as a 400 error and a JSON body in the format shown below.
{"error": "invalid_token"}

By design, no additional information is given as to the reason for the failure.

Note: In practice, a 400 error typically indicates that the access token request URL was malformed, often due to improper URL escaping.

The JavaScript snippet below parses the response from Google's authorization server and then validates the access token. If the token is valid, the code stores it in the browser's local storage. You could modify the snippet to also send the token to your server as a means of making the token available to other API clients.

var queryString = location.hash.substring(1);
var params = {};
var regex = /([^&=]+)=([^&]*)/g, m;
while (m = regex.exec(queryString)) {
  params[decodeURIComponent(m[1])] = decodeURIComponent(m[2]);
  // Try to exchange the param values for an access token.
  exchangeOAuth2Token(params);
}

/* Validate the access token received on the query string. */
function exchangeOAuth2Token(params) {
  var oauth2Endpoint = 'https://www.googleapis.com/oauth2/v3/tokeninfo';
  if (params['access_token']) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', oauth2Endpoint + '?access_token=' + params['access_token']);
    xhr.onreadystatechange = function (e) {
      var response = JSON.parse(xhr.response);
      // Verify that the aud property in the response matches YOUR_CLIENT_ID.
      if (xhr.readyState == 4 && xhr.status == 200 &&
          response['aud'] && response['aud'] == YOUR_CLIENT_ID) {
        localStorage.setItem('oauth2-test-params', JSON.stringify(params));
      } else if (xhr.readyState == 4) {
        console.log('There was an error processing the token, another ' +
                    'response was returned, or the token was invalid.');
      }
    };
    xhr.send(null);
  }
}

Calling Google APIs

After your application obtains an access token, you can use the token to make calls to a Google API on behalf of a given user account or service account.
To do this, include the access token in a request to the API by including either an access_token query parameter or an Authorization: Bearer HTTP header. When possible, the HTTP header is preferable, because query strings tend to be visible in server logs. In most cases you can use a client library to set up your calls to Google APIs (for example, when calling the Drive API).

You can try out all the Google APIs and view their scopes at the OAuth 2.0 Playground.

HTTP GET examples

A call to the drive.files endpoint (the Drive API) using the Authorization: Bearer HTTP header might look like the following. Note that you need to specify your own access token:

GET /drive/v2/files HTTP/1.1
Authorization: Bearer access_token
Host: www.googleapis.com

Here is a call to the same API for the authenticated user using the access_token query string parameter:

GET https://www.googleapis.com/drive/v2/files?access_token=access_token

curl examples

You can test these commands with the curl command-line application. Here's an example that uses the HTTP header option (preferred):

curl -H "Authorization: Bearer access_token" https://www.googleapis.com/drive/v2/files

Or, alternatively, the query string parameter option:

curl https://www.googleapis.com/drive/v2/files?access_token=access_token

JavaScript sample code

The code snippet below demonstrates how to use CORS (Cross-origin resource sharing) to send a request to a Google API. This example does not use the Google APIs Client Library for JavaScript. However, even if you are not using the client library, the CORS support guide in that library's documentation will likely help you to better understand these requests.

In this code snippet, the access_token variable represents the token you have obtained to make API requests on the authorized user's behalf. The complete example demonstrates how to store that token in the browser's local storage and retrieve it when making an API request.
var xhr = new XMLHttpRequest();
xhr.open('GET',
  'https://www.googleapis.com/drive/v3/about?fields=user&' +
  'access_token=' + params['access_token']);
xhr.onreadystatechange = function (e) {
  console.log(xhr.response);
};
xhr.send(null);

Complete example

This code sample demonstrates how to complete the OAuth 2.0 flow in JavaScript without using the Google APIs Client Library for JavaScript. The code is for an HTML page that displays a button to try an API request. If you click the button, the code checks to see whether the page has stored an API access token in your browser's local storage. If so, it executes the API request. Otherwise, it initiates the OAuth 2.0 flow.

For the OAuth 2.0 flow, the page follows these steps:

1. It directs the user to Google's OAuth 2.0 server, which requests access to the https://www.googleapis.com/auth/drive.metadata.readonly scope.
2. After granting (or denying) access, the user is redirected to the original page, which parses the access token from the hash fragment.
3. The page validates the access token and, if it is valid, executes the sample API request. The API request calls the Drive API's about.get method to retrieve information about the authorized user's Google Drive account. If the request executes successfully, the API response is logged in the browser's debugging console.

You can revoke access to the app through the Permissions page for your Google Account. The app will be listed as OAuth 2.0 Demo for Google API Docs.

To run this code locally, you need to set values for the YOUR_CLIENT_ID and YOUR_REDIRECT_URI variables that correspond to your authorization credentials. The YOUR_REDIRECT_URI should be the same URL where the page is being served. Your project in the Google API Console must also have enabled the appropriate API for this request.
<html>
<head></head>
<body>
<script>
  var YOUR_CLIENT_ID = 'REPLACE_THIS_VALUE';
  var YOUR_REDIRECT_URI = 'REPLACE_THIS_VALUE';

  var queryString = location.hash.substring(1);

  // Parse query string to see if page request is coming from OAuth 2.0 server.
  var params = {};
  var regex = /([^&=]+)=([^&]*)/g, m;
  while (m = regex.exec(queryString)) {
    params[decodeURIComponent(m[1])] = decodeURIComponent(m[2]);
    // Try to exchange the param values for an access token.
    exchangeOAuth2Token(params);
  }

  // If there's an access token, try an API request.
  // Otherwise, start OAuth 2.0 flow.
  function trySampleRequest() {
    var params = JSON.parse(localStorage.getItem('oauth2-test-params'));
    if (params && params['access_token']) {
      var xhr = new XMLHttpRequest();
      xhr.open('GET',
        'https://www.googleapis.com/drive/v3/about?fields=user&' +
        'access_token=' + params['access_token']);
      xhr.onreadystatechange = function (e) {
        console.log(xhr.response);
      };
      xhr.send(null);
    } else {
      oauth2SignIn();
    }
  }

  /*
   * Create form to request access token from Google's OAuth 2.0 server.
   */
  function oauth2SignIn() {
    // Google's OAuth 2.0 endpoint for requesting an access token
    var oauth2Endpoint = 'https://accounts.google.com/o/oauth2/v2/auth';

    // Create element to open OAuth 2.0 endpoint in new window.
    var form = document.createElement('form');
    form.setAttribute('method', 'GET'); // Send as a GET request.
    form.setAttribute('action', oauth2Endpoint);

    // Parameters to pass to OAuth 2.0 endpoint.
    var params = {'client_id': YOUR_CLIENT_ID,
                  'redirect_uri': YOUR_REDIRECT_URI,
                  'scope': 'https://www.googleapis.com/auth/drive.metadata.readonly',
                  'state': 'try_sample_request',
                  'include_granted_scopes': 'true',
                  'response_type': 'token'};

    // Add form parameters as hidden input values.
    for (var p in params) {
      var input = document.createElement('input');
      input.setAttribute('type', 'hidden');
      input.setAttribute('name', p);
      input.setAttribute('value', params[p]);
      form.appendChild(input);
    }

    // Add form to page and submit it to open the OAuth 2.0 endpoint.
    document.body.appendChild(form);
    form.submit();
  }

  /* Verify the access token received on the query string. */
  function exchangeOAuth2Token(params) {
    var oauth2Endpoint = 'https://www.googleapis.com/oauth2/v3/tokeninfo';
    if (params['access_token']) {
      var xhr = new XMLHttpRequest();
      xhr.open('POST', oauth2Endpoint + '?access_token=' + params['access_token']);
      xhr.onreadystatechange = function (e) {
        var response = JSON.parse(xhr.response);
        // When request is finished, verify that the aud property in the
        // response matches YOUR_CLIENT_ID.
        if (xhr.readyState == 4 && xhr.status == 200 &&
            response['aud'] && response['aud'] == YOUR_CLIENT_ID) {
          // Store granted scopes in local storage to facilitate
          // incremental authorization.
          params['scope'] = response['scope'];
          localStorage.setItem('oauth2-test-params', JSON.stringify(params));
          if (params['state'] == 'try_sample_request') {
            trySampleRequest();
          }
        } else if (xhr.readyState == 4) {
          console.log('There was an error processing the token, another ' +
                      'response was returned, or the token was invalid.');
        }
      };
      xhr.send(null);
    }
  }
</script>

<button onclick="trySampleRequest();">Try sample request</button>
</body>
</html>

Incremental authorization

In the OAuth 2.0 protocol, your app requests authorization to access resources, which are identified by scopes. It is considered a best user-experience practice to request authorization for resources at the time you need them. To enable that practice, Google's authorization server supports incremental authorization. This feature lets you request scopes as they are needed and, if the user grants permission, add those scopes to your existing access token for that user.

For example, an app that lets people sample music tracks and create mixes might need very few resources at sign-in time, perhaps nothing more than the name of the person signing in. However, saving a completed mix would require access to their Google Drive.
Most people would find it natural if they were only asked for access to their Google Drive at the time the app actually needed it. In this case, at sign-in time the app might request the profile scope to perform basic sign-in, and then later request the https://www.googleapis.com/auth/drive.file scope at the time of the first request to save a mix.

The following rules apply to an access token obtained from an incremental authorization:

- The token can be used to access resources corresponding to any of the scopes rolled into the new, combined authorization.
- When you use the refresh token for the combined authorization to obtain an access token, the access token represents the combined authorization and can be used for any of its scopes.
- The combined authorization includes all scopes that the user granted to the API project even if the grants were requested from different clients. For example, if a user granted access to one scope using an application's desktop client and then granted another scope to the same application via a mobile client, the combined authorization would include both scopes.
- If you revoke a token that represents a combined authorization, access to all of that authorization's scopes on behalf of the associated user is revoked simultaneously.

The code samples below show how to add scopes to an existing access token. This approach allows your app to avoid having to manage multiple access tokens.

To add scopes to an existing access token, include the include_granted_scopes parameter in your request to Google's OAuth 2.0 server. The following code snippet demonstrates how to do that. The snippet assumes that you have stored the scopes for which your access token is valid in the browser's local storage. (The complete example code stores a list of scopes for which the access token is valid by setting the oauth2-test-params.scope property in the browser's local storage.)
The snippet compares the scopes for which the access token is valid to the scope you want to use for a particular query. If the access token does not cover that scope, the OAuth 2.0 flow starts. Here, the oauth2SignIn function is the same as the one that was provided in step 2 (and that is provided later in the complete example).

var SCOPE = 'https://www.googleapis.com/auth/drive.metadata.readonly';
var params = JSON.parse(localStorage.getItem('oauth2-test-params'));

var current_scope_granted = false;
if (params.hasOwnProperty('scope')) {
  var scopes = params['scope'].split(' ');
  for (var s = 0; s < scopes.length; s++) {
    if (SCOPE == scopes[s]) {
      current_scope_granted = true;
    }
  }
}

if (!current_scope_granted) {
  oauth2SignIn(); // This function is defined elsewhere in this document.
} else {
  // Since you already have access, you can proceed with the API request.
}

Revoking a token

In some cases a user may wish to revoke access given to an application. A user can revoke access by visiting Account Settings. It is also possible for an application to programmatically revoke the access given to it. Programmatic revocation is important in instances where a user unsubscribes or removes an application. In other words, part of the removal process can include an API request to ensure the permissions granted to the application are removed.

To programmatically revoke a token, your application makes a request to https://accounts.google.com/o/oauth2/revoke and includes the token as a parameter:

curl -H "Content-type: application/x-www-form-urlencoded" \
  https://accounts.google.com/o/oauth2/revoke?token={token}

The token can be an access token or a refresh token. If the token is an access token and it has a corresponding refresh token, the refresh token will also be revoked.

Note: Google's OAuth 2.0 endpoint for revoking tokens supports JSONP and form submissions. It does not support Cross-origin Resource Sharing (CORS).
If the revocation is successfully processed, then the status code of the response is 200. For error conditions, a status code 400 is returned along with an error code.

The following JavaScript snippet shows how to revoke a token in JavaScript without using the Google APIs Client Library for JavaScript. Since Google's OAuth 2.0 endpoint for revoking tokens does not support Cross-origin Resource Sharing (CORS), the code creates a form and submits the form to the endpoint rather than using the XMLHttpRequest() method to post the request.

function revokeAccess(accessToken) {
  // Google's OAuth 2.0 endpoint for revoking access tokens.
  var revokeTokenEndpoint = 'https://accounts.google.com/o/oauth2/revoke';

  // Create form element to use to POST data to the OAuth 2.0 endpoint.
  var form = document.createElement('form');
  form.setAttribute('method', 'post');
  form.setAttribute('action', revokeTokenEndpoint);

  // Add access token to the form so it is set as value of token parameter.
  // This corresponds to the sample curl request, where the URL is
  // https://accounts.google.com/o/oauth2/revoke?token={token}
  var tokenField = document.createElement('input');
  tokenField.setAttribute('type', 'hidden');
  tokenField.setAttribute('name', 'token');
  tokenField.setAttribute('value', accessToken);
  form.appendChild(tokenField);

  // Add form to page and submit it to actually revoke the token.
  document.body.appendChild(form);
  form.submit();
}

Note: Following a successful revocation response, it might take some time before the revocation has full effect.
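The XMLHttpRequest examples in the Calling Google APIs section above can also be written with the fetch API, which modern browsers support alongside XMLHttpRequest. A minimal sketch (the function names are ours; the Drive URL is the one used in this document's complete example):

```javascript
// Sketch: build the Authorization: Bearer header for a given access token.
// bearerHeaders and getDriveUser are illustrative names, not library APIs.
function bearerHeaders(accessToken) {
  return { 'Authorization': 'Bearer ' + accessToken };
}

// Sketch: the Drive about.get call from the complete example, via fetch.
// Returns a promise that resolves to the parsed JSON response.
function getDriveUser(accessToken) {
  return fetch('https://www.googleapis.com/drive/v3/about?fields=user', {
    headers: bearerHeaders(accessToken)
  }).then(function (res) { return res.json(); });
}
```

Sending the token in the header rather than as an access_token query parameter keeps it out of server logs, as recommended in the Calling Google APIs section.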
https://w.atwiki.jp/rokuroku/pages/12.html
902 名前:dormcat[sage] 投稿日:2007/02/27(火) 18:01:38 ID:???0

魚介類さんと皆さんへ (To 魚介類-san and everyone):

This is my first time posting on 2ch. I'm afraid that using Japanese at my awful level would create more misunderstandings, so please forgive me for using English instead. And first I have to state that this post is my personal opinion and does not represent ANN or its other staff. I'd like to thank you for your understanding.

I know well that we fans can barely tolerate incorrect information, and seeing one's beloved title lacking important key animators can be frustrating. Reporting errors to the ANN Encyclopedia could be peaceful and reasonable, as I demonstrated in the error report example I wrote on the ANN forum. Unfortunately, some short-tempered (no offense intended) users at 2ch quickly established threads here, as well as sending numerous not-so-friendly emails to ANN and adding a special "reliability" section to ANN's entry in Wikipedia Japan, with an incorrect relationship between ANN and Anime Expo (there was no such relationship at all). The amount of angry mail was quite unsettling; combined with the ever-increasing criticism (such as lack of Japanese-capable staff, reviewing unlicensed titles, etc.), a defensive wall has been erected in our hearts.

While the incident had cooled down for over two weeks, Milk-san's post brought me back to an issue I thought long resolved. The last sentence of his post ("By the way, those Otakus outside of Japan really know the difference of key-animation and between animation?(and what chief animater is?) I just wonder.") was particularly irritating; it was almost discriminatory in my eyes (I don't know if he had such intention when he wrote it, though).

As a database that relies on user submissions (like Wikipedia), ANN can never guarantee its Encyclopedia to be completely error-free, but the constant criticism from 2channelers, asking for a "perfect" Encyclopedia, never stops.
The entire website, as well as every one of its staff, was put under a microscope in order to pick out every single imperfection. While constructive criticism is welcome, posts saturated with anger have the opposite effect; we started to think, "I maintain this database with no material compensation at all; only my passion and love for anime keep me doing this. Now why should I suffer this unnecessary anger? I can simply drop everything, go away, and pick up a normal life." This feeling was particularly strong after seeing my net ID "stolen" by others in order to edit ANN's entry in Wikipedia Japan. Seeing my longtime net ID being used against what I spend most of my free time on made me snap. This incident also makes me wonder: why aren't there more Japanese fans participating in the ANN community? Its users are from all over the world, so language should not be the biggest concern. Furthermore, submitting correct new information and/or error reports on existing information requires minimal language skills, as many of our major contributors are not active forum participants. I can assure you that, as long as you keep it peaceful and civil, native Japanese users are highly welcome at ANN, for your language and cultural background allows you to access first-hand information, and your opinions are likely to be regarded as more important. I admit that there were numerous misunderstandings between us. One issue that many repliers keep mentioning is that I called some 2channelers "kitchen boys" (厨房). This was targeted only at those who kept "ranting" -- shouting "Korea has no relation to this anime at all!" or "suspecting" that ANN or I had received bribes from Koreans. My apologies to 魚介類さん, Milk-san, 784, and other reasonable 2channelers who provided valuable services and suggestions, but I refuse to apologize to those who hinted that ANN or I had taken immoral benefits from a third party, as well as to those who said errors on ANN were "intentionally fabricated" (捏造).
Right now Yukikaze and Macross Plus have been fixed. I browsed this 2ch thread and found some titles have their ANN entry incomplete and waiting to be amended, such as Metropolis Tenjho Tenge OVA Please let us know (calmly, please) if there are other errors in the Encyclopedia. Now, can we shake hand and announce cease fire? 魚介類さんと皆さんへ: これは、初めての2chへの投稿です。日本語が下手だからもっと誤解されると困るので、英語を使う事を許して欲しい。 まず言わなければならない事は、私がANNとか他のスタッフをを代表している訳では無い事。理解してくれると有り難い。私達ファンは、間違った情報に我慢出来ない事はよく解っている。好きな作品の重要なアニメーターが書かれていない事は、苛立たしい事だ。ANNへの間違いの指摘は、普通に筋が通っている。ANNフォーラムへ書かれた間違いを指摘するレポートの例を示したように。 残念ながら、ある気の早い人が、2chにスレを立てた。攻撃意図は無いだろうが。 そして、勿論、あまり好意的でないメールがANNに送られて、WikipediaのANNに「内容の信憑性」が付け加えられている。ANNとAnime Expoの不正確な関係も。関係は全く無いのに。沢山の怒りのメールは、とても、慌しく、さらに、どんどん 批判が増えている。日本語能力のあるスタッフが居ないとか、ライセンスされてないと書かれているタイトルの見直しについて、などへの批判だ。 自分を守ろうとする壁がココロの中に出来た。このハプニングは、二週間で落ち着いてきたけど、ミルクさんは投稿を、解決したと思っていた問題を持ち出してきた。最後の節、 「所で海外のオタク達は、本当に違いを知っているの?原画とか動画とか、チーフアニメーターが何か」 というのは、特にいらいらした。その位解るから。彼がそういう意図で書いたのかどうか不明だけど。データーベースは、 Wikipediaと同じでユーザー達が書いている。ANNは、完全に正確である事を保証できない。2chねらからいつもある批判は、完全なデーターベースを求めてやまない。全てのスタッフだけでなく、全てのウェブは、顕微鏡で粗探しをされている。一つ一つの間違いを取り上げる為に。 建設的な批判は、望む所だが、怒りに満ちた投稿は、逆効果だ。「見返りもなく、このデーターベースを維持している。アニメへの愛がこれを続けさせている。なんでこんなに、不必要な、嫌な事ばっかりあるの?全てを捨てて普通の生活に戻る事も出来るし」と思い始めている。 この感覚は、特に日本のWikipediaのANNの編集の為に他の人が私のネットIDを盗んだのを見たとき特に感じた。 (日本のWikipedia編集のdormcatは、私じゃない) 長い間、自由時間の大半を使ってきた事と違う事に使われている。ずっと使ってきたネットIDなのに、ただ、腹が立ってしょうがない。 この事件は、また、解らなくさせている。ANNに参加する日本のファンは、なんでもっと居ないのか。ユーザーは世界中から来ていて、言語は重大な障害ではないはず。さらに、正しい新しい情報を出したり、間違いの報告は、英語などの言語 能力は、最低限でいい。多くの主なそういう人は、フォーラムに書き込まないので。 確信できるが、平和的で、礼儀正しい人なら、日本人のユーザーはとてもANNに歓迎される。日本語とか日本文化の背景によって、直接的な情報を吟味する事が出来るから。そして、貴方の意見は、より重要な物として扱われるだろう。 私達の間には沢山の誤解があった事を認める。 多くの相手が言い続けている、一つの問題は、(厨房)の事だ。これは、ただ、怒鳴り散らす人の事を言っているだけだ。例えば、「韓国はこのアニメと関係ない」とか「ANNやお前は韓国から裏で金を貰っているだろう」とか。 魚介類さん, Milk-san, 784,そして、他の、まともな2chねら、意味のある世話とか意見を出してくれた人々には詫びないといけない。 
でも、ANNとか私が悪い金を他所から貰っていると仄めかす人々や、勿論、捏造しているとかいう人々への謝罪なんか無い。今、雪風とマクロスプラスは、直された。このスレを見ていて、幾つかのタイトルは、ANNで不完全で、直さなければならない事が解った。例えば、メトロポリスとか天上天下のOVAとか。間違いがあれば、知らせて欲しい(どうか落ち着いてください)。 今、私達は握手して、停戦出来ますよね? では。 dorcatより。
https://w.atwiki.jp/swfspec/pages/58.html
ExportAssets
The ExportAssets tag makes portions of a SWF file available for import by other SWF files (see ImportAssets). For example, a Font character for a custom font embedded in one SWF file can be exported and then shared among ten SWF files on the same web site. Each exported character is identified by a string. Characters other than fonts can also be exported. If the same character ID appears more than once within an ExportAssets tag, the last Name identifier defined is used. ExportAssets is available in SWF 5 and later.

ExportAssets fields:
Field    Type          Comment
Header   RECORDHEADER  Tag type = 56
Count    UI16          Number of assets to export
Tag1     UI16          Character ID of the first character to export
Name1    STRING        Identifier of the first character to export
...
TagN     UI16          Character ID of the last character to export
NameN    STRING        Identifier of the last character to export

Navigation: previous page: End / next page: ImportAssets
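To make the field layout above concrete, here is a sketch of reading an ExportAssets tag body in JavaScript. The function name and calling convention are illustrative: it assumes the RECORDHEADER has already been consumed, that the UI16 fields are little-endian, and that STRING values are null-terminated, per the SWF format.

```javascript
// Parse the body of an ExportAssets tag (tag type 56):
// Count (UI16) followed by Count pairs of (Tag: UI16, Name: STRING).
function parseExportAssets(body) {
  var pos = 0;
  function readUI16() {
    // SWF integers are little-endian.
    var v = body[pos] | (body[pos + 1] << 8);
    pos += 2;
    return v;
  }
  function readString() {
    // SWF STRING values are null-terminated.
    var end = pos;
    while (body[end] !== 0) end++;
    var s = String.fromCharCode.apply(null, Array.prototype.slice.call(body, pos, end));
    pos = end + 1;
    return s;
  }
  var count = readUI16();
  var assets = [];
  for (var i = 0; i < count; i++) {
    assets.push({ id: readUI16(), name: readString() });
  }
  return assets;
}
```

Note that if the same character ID appears twice, this sketch simply returns both entries; per the text above, a player uses the last Name defined.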
https://w.atwiki.jp/mrfrtech/pages/71.html
Market Analysis
The global Smart Commute Market is predicted to reach USD 104.22 billion, growing at a whopping 25.52% CAGR between 2020 and 2027, states the recent Market Research Future (MRFR) analysis. Smart commute, simply put, is traveling regularly from one place to another via metro, bicycle pooling, bike pooling, vanpooling, and other modes. Traffic management, parking management, smart ticketing, mobile apps, and others are the different solutions. These services provide an eco-friendly traveling experience and offer enterprise employees an active alternative to transportation, letting them pick an effective commute option that, along with reducing congestion, also helps lower transportation costs. Various factors are propelling the global smart commute market share. According to the recent MRFR report, such factors include high demographic rates, a growing urban population, the use of smart and connected technologies in transportation infrastructure, growing demand for transportation as a service, and a decline in vehicle ownership with shared mobility. Additional factors adding to market growth include several upcoming railway projects; growing urbanization and industrialization; demand for public transport and smart transportation solutions as well as related components, services, and software; advances in technology such as the implementation of electronic payment systems, traveler information systems, and automatic vehicle location systems; and benefits such as increased traveler convenience, reduced traffic congestion, lower emission levels, and improved fuel economy. On the contrary, high component costs, concerns about security and the management of data storage, the slow growth rate of GDP, susceptibility to cyberattacks, and the ongoing COVID-19 impact are factors that may impede global smart commute application market growth over the forecast period.
Get a Free Sample @ https://www.marketresearchfuture.com/sample_request/6975

Market Segmentation
The MRFR report highlights an inclusive segmental analysis of the global smart commute market based on solution and type. Based on type, the global smart commute market is segmented into metro, bicycle pooling, bike pooling, vanpooling, carpooling, and others. Based on solution, the global smart commute market is segmented into traffic management, parking management, smart ticketing, mobile app, and others.

Regional Analysis
Based on region, the global smart commute market report covers the recent trends and growth opportunities across Asia Pacific (APAC), North America, Europe, and the Rest of the World (RoW). Of these, the APAC region is predicted to hold the lion's share over the forecast period. Strict government norms and regulations related to greenhouse gas emissions, adoption of car sharing services, and the upcoming availability of zero-emission car sharing services are adding to the market's growth in the region. The global smart commute market in Europe is predicted to have healthy growth over the forecast period. Improved socio-economic conditions in France, the UK, and Germany are adding to the market's growth in the region. The global smart commute market in North America is predicted to have sound growth over the forecast period. Favorable incentives introduced by the government for promoting carpooling services are adding to the market's growth in the region. The global smart commute market in the RoW is predicted to have steady growth over the forecast period.

Key Players
Leading contenders profiled in the global smart commute market report include Carma Technology Corporation (Europe), Turo (US), BlaBlaCar (France), CommuteSMART (US), Oakland Smart Commute (California), Central Indiana Regional Transportation Authority (CIRTA) (US), ZipGo Technologies Pvt.
Ltd. (India), Metrolinx (Canada), ANI Technologies Pvt. Ltd. (India), Uber Technologies Inc. (India), Quick Ride (India), ePoolers Technologies Pvt. Ltd. (India), and South Florida Commuter Services (US). Industry players have adopted several strategies, such as mergers, new product launches, strategic alliances, geographic expansions, extensive R&D activities, new product development, and others, to stay at the forefront.

Browse Full Report @ https://www.marketresearchfuture.com/reports/smart-commute-market-6975

Table of Contents
1 Executive Summary
2 Scope of The Report
2.1 Market Definition
2.2 Scope of The Study
2.2.1 Research Objectives
2.2.2 Assumptions & Limitations
2.3 Market Structure
Continued…

Similar Report:
B2B Telecommunication Market Information by Solution (Unified Communication and Collaboration), Deployment (Fixed, Mobile), Organization Size (Large, Enterprise), Application (Industrial, Commercial) and Regions

Trending MRFR Reports:
https://ictmrfr.blogspot.com/2022/04/geofencing-market-companies-growth-with.html
https://blogfreely.net/pranali004/telecom-expense-management-market-size-impressive-cagr-changing-business-scope
https://postheaven.net/pranali004/financial-app-industry-impressive-cagr-changing-business-needs-scope-of
https://market-research-future.tribe.so/post/openstack-service-market-research-impressive-cagr-changing-scope-of-current--6263de46791566c10c79891e
https://www.scutify.com/articles/2022-04-24-infrastructure-as-a-service-industry-cagr-changing-business-scope-of-current-and-future-industry-

About Market Research Future
At Market Research Future (MRFR), we enable our customers to unravel the complexity of various industries through our Cooked Research Report (CRR), Half-Cooked Research Reports (HCRR), Raw Research Reports (3R), Continuous-Feed Research (CFR), and Market Research Consulting Services.
Contact
Market Research Future (Part of Wantstats Research and Media Private Limited)
99 Hudson Street, 5th Floor
New York, NY 10013
United States of America
+1 628 258 0071 (US)
+44 2035 002 764 (UK)
Email: sales@marketresearchfuture.com
Website: https://www.marketresearchfuture.com
https://w.atwiki.jp/techsure/pages/37.html
This page is quoted from http://www.vgleaks.com/durango-sound-of-tomorrow/

Xbox One (Durango): Sound of Tomorrow
One of the few components of Xbox One (Durango) that remain unrevealed is the sound block. This article is intended to describe this important part of the system. The Xbox One (Durango) audio architecture seeks a balance between the successes and tradeoffs of previous-generation platforms while anticipating the increasing technical needs of next-generation implementations. It provides hardware-accelerated pathways for the most common aspects of audio rendering (compression, mixing, filtering, and so on) on a large number of concurrent voices. The architecture also provides a shared resource model for software processing consumption, allowing each individual title to select what and how much custom signal manipulation to apply in CPU utilization.

Audio Architectural Overview
In addition to general CPU power (which can be used for decoding, synthesis, rendering, and so on), Durango provides several hardware components dedicated to audio processing. The audio hardware components can address the entire unified memory space. The SHAPE (Scalable Hardware Audio Processing Engine) block comprises the majority of audio functionality, although the other processors also contribute significant features.

SHAPE (Scalable Hardware Audio Processing Engine)
The core hardware dedicated to audio processing is SHAPE. It is designed to perform many of the basic operations commonly required on a per-voice basis. This hardware allows a developer to reduce CPU impact, even for high polyphony and complex signal routings, and still provides the flexibility of SHAPE/CPU data interchange if a title chooses to perform custom digital signal processing, analysis, or software synthesis. SHAPE operates on blocks of 128 samples, where each sample supports 24-bit integer resolution (or 32-bit float when used by the CPU).
At 48 kHz, this represents a 2.67 ms audio frame, providing increased timing resolution and decreased latency compared with the Xbox 360's 256-sample block size. SHAPE offers six fixed-function blocks focused on common audio tasks:

1. XMA Decoder - Concurrent decoding of 512 XMA-format voices. XMA is a perceptual codec developed for Xbox 360, offering user-tunable quality and typically providing between 6:1 and 14:1 compression.
2. SRC - A high-quality dedicated polyphase sample rate conversion block allowing for high-performance, high-quality frequency resampling of 512 mono channels of audio data (whether for format conversion, Doppler effect, or pitch variation).
3. Mix Buffers - Dedicated accumulators for 128 in-place mix channels without needing to access memory, with additional channels available virtually. These mix buffers also provide coarse metering and clipping detection for debugging and monitoring.
4. FLT/VOL - A module providing both volume scaling and a state variable filter implementation for more than 2,500 voices/mixes, analogous to the software-exposed XAudio2 per-voice filter available on Windows and Xbox 360. The filter can provide low-pass, high-pass, band-pass, or notch filtering, and exposes Q and cutoff/center frequency parameters. It is used most commonly for distance and occlusion modeling.
5. EQ/CMP - A module providing up to 512 channels of 3-band equalization and dynamic range compression. The EQ comprises three serially cascaded biquad filters. The compressor has a hard-knee response and supports both side-chain and expander functionality.
6. DMA - SHAPE has dedicated DMA hardware for transferring audio data to and from the unified memory space. This enables scenarios that include transfer without a sample rate converter, transferring final mix channels, and CPU-based processing in the middle of a SHAPE-based audio graph.

Playback of a typical audio graph is expected to use each of these processors extensively.
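The frame timing quoted above follows directly from the block size: duration = samples / sample rate. A small illustrative sketch (the figures come from the article; the helper function is not part of any SDK):

```javascript
// Audio frame duration in milliseconds for a given block size and sample rate.
function frameMs(blockSamples, sampleRateHz) {
  return (blockSamples / sampleRateHz) * 1000;
}

// SHAPE's 128-sample block at 48 kHz is roughly a 2.67 ms frame,
// half the latency of the Xbox 360's 256-sample block (about 5.33 ms).
var shapeFrame = frameMs(128, 48000);
var x360Frame = frameMs(256, 48000);
```

Halving the block size halves the minimum scheduling granularity, which is why the article cites both finer timing resolution and lower latency.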
ACP (Audio Control Processor)
The ACP provides state management and scheduling of all the other audio hardware components on the North Bridge. This makes CPU involvement in intra-frame processing, and the synchronization/latency it might introduce, unnecessary.

ASP (Audio Scalar Processor)
The ASP supports scalar float and vector integer operations. Voice chat codecs, both those that manage wireless communication between a voice chat headset and the console and those that compress/decompress voice data for networked voice communication, are provided in hardware. Additionally, this processor supports xWMA-format decompression in hardware; on Xbox 360, xWMA was solely a CPU-side decode option.

AVP (Audio Vector Processor)
The AVP supports vector float operations and is designed primarily for MEC (multichannel echo cancellation) and other noise reduction for the next-generation Kinect audio input. It supports both speech recognition and chat/arbitrary audio input use. MEC and other noise reduction processing allow for a more intelligible stream of the player's spoken audio data, even from a far-talk microphone that is typically positioned closer to the output speakers than to the player.

Audio and Durango Hardware
Durango's audio output pipeline eliminates the DAC (digital-to-analog converter) found in previous-generation consoles. All audio is output strictly in the digital realm, either over HDMI 1.4a or as S/PDIF optical output. HDMI 1.4a allows high-fidelity linear 7.1-channel PCM to be transmitted from the console; titles default to an output sampling rate of 48 kHz and a bit depth of 24 bits. Durango is also designed to support up to four simultaneous stereo headset outputs, each of which can represent a unique multichannel mix that is downmixed as required by the output format (for instance, a headset or the S/PDIF output).
Durango accepts audio input from a variety of sources: the next-generation Kinect microphone array, voice chat headsets, other audio input peripherals, and storage media (whether HDD, flash, or cloud storage). Audio can also be algorithmically generated through CPU-based computation and manipulated in real time on a CPU, through the aforementioned SHAPE hardware components, or both.

Compression Formats
Durango offers hardware decompression support for both XMA2 and xWMA, both of which provide significant storage, bandwidth, and memory reductions over uncompressed PCM. XAudio2 also offers software support for ADPCM (Adaptive Differential Pulse Code Modulation). Although the computation for the ADPCM format is low overhead, as a non-perceptual codec ADPCM can exhibit noticeable artifacts at lower sampling rates.

Format  Compression (approximate)  Durango                    Xbox 360             Loop capability
PCM     None                       Yes                        Yes                  Arbitrary
ADPCM   3.5:1 to 4:1               (Software)                                      Block aligned
XMA2    6:1 to 14:1                Yes (512, hardware)        Yes (320, hardware)  128-sample aligned
xWMA    20:1 to 40:1               Yes (hardware + software)  Yes (software only)  End to end only, may gap

Additional audio formats, for instance MP3 or OGG, can be provided for game assets through title or middleware software codecs running on a CPU.

Audio and the Durango App Model
While in the foreground, an application has full access to the SHAPE hardware. When that application is pushed to the background (pinned, picture-in-picture, or other scenarios), it relinquishes hardware control. By default, its hardware state is suspended and resumes when the title returns to the foreground. This is also true for Exclusive Resource Applications (ERAs), where the software graph is suspended. A title may optionally choose to tear down its audio graph and reconstruct it upon resume. Some titles, particularly Shared Resource Applications (SRAs) that play background music such as streaming radio, may choose to have some aspects of audio continue to play even while paused.
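As a rough illustration of what the compression ratios above mean for storage (the ratios are taken from the table; pcmBytesPerSecond and compressedBytes are hypothetical helpers, not platform APIs):

```javascript
// Bytes for one second of uncompressed PCM.
function pcmBytesPerSecond(sampleRateHz, bitDepth, channels) {
  return sampleRateHz * (bitDepth / 8) * channels;
}

// Approximate size after compression at a given ratio (e.g. 6 means 6:1).
function compressedBytes(pcmBytes, ratio) {
  return Math.round(pcmBytes / ratio);
}

// One second of 48 kHz / 24-bit stereo PCM is 288,000 bytes;
// XMA2 at 6:1 to 14:1 shrinks it to roughly 48,000 down to about 20,600 bytes.
var pcm = pcmBytesPerSecond(48000, 24, 2);
var xmaWorst = compressedBytes(pcm, 6);
var xmaBest = compressedBytes(pcm, 14);
```

The same arithmetic applies to bandwidth and memory, which is why the article treats hardware decompression as a significant win over uncompressed PCM.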
For these scenarios, titles should closely evaluate whether to attempt a seamless transition from hardware to software rendering, or to always play audio intended for background playback via a software-only pipeline. This has implications for compression formats and CPU costs. XMA-compressed assets, for example, require the use of SHAPE hardware, and thus will not be decodable for a background application. The XAudio2 audio engine does provide software pathways for many functions if a title chooses to allocate CPU resources. Where practical, these functions mimic hardware capabilities, but some compute-intensive processing is either unavailable or implemented differently in software. Titles transitioning from hardware processing to software processing based on an app's state may want to consider these differences when planning their audio pipelines.

Feature                       Durango Hardware Capability                       Durango Software Capability                                    Equivalent?
Sample Rate Conversion (SRC)  Polyphase                                         XAudio2 linear interpolation                                   No
Parametric EQ (EQ/CMP)        3-band EQ                                         3-band EQ, simple one-band, or custom DSP                      Yes
Compressor/Limiter (EQ/CMP)   Hard-knee, side chain, and expander capabilities  Hard-knee, side chain, and expander capabilities               Yes
Filtering (FLT/VOL)           State variable filter                             XAudio2 state variable filter, single-pole LPF, or custom DSP  Yes
Mixing (Mix Buffers)          Includes clip detection, metering                 Software mixing; custom DSP for clip detection or metering     Yes (for mixing)

Durango Audio Libraries
Durango supports two audio rendering APIs for typical game use, along with a variant of the Windows 8 Media Foundation API for playback of user music:

1. XAudio2, a game-focused audio library already available on Xbox 360 and Windows operating systems (Windows XP to Windows 8), is generally recommended for most title development.
2. WASAPI (Windows Audio Session API) can be used for any custom, exclusively software-implemented pipeline. WASAPI provides audio endpoint functionality only.
Decompression, sample-rate conversion, mixing, and digital-signal processing, as well as interactions with Durango’s audio hardware components, must be implemented by the client. WASAPI is most typically used by audio middleware solutions. The Microsoft Cross-Platform Audio Creation Tool (XACT) and DirectSound are not supported in the Durango environment. Titles that previously used these technologies should consider the solutions identified above, or use approved Durango audio middleware options.
https://w.atwiki.jp/battlestationsmidway/pages/15.html
―――メインメニュー――― Piloting a Ship 操船 November 17, 1941. Welcome to the U.S. Naval Academy. In this mission you will learn how to use the camera and how to pilot a ship. 1941年11月17日 合衆国海軍兵学校へようこそ。 この作戦ではカメラの扱い方と操船を学びます。 ―――ロード画面――― November 17, 1941 Piloting a ship 1941年11月17日 操船 Use Camera Reset to recenter your camera to look straight ahead 真っ正面を見るためには『カメラ・リセット』を使え Take care not to hit the coastline - collisions damage your ships 海岸線に激突するな―――衝突は船に損害を与える You can still control your ship's helm while in binocular mode 双眼鏡モードでも船は操縦できる ―――ゲーム画面――― Welcome to the U.S. Naval Academy. Today, you learn ship control. First, look at the instruments. 合衆国海軍兵学校へようこそ。 今日は操船を学ぶ。 まず、計器を見ろ。 Top right is the compass. It displays the heading of your ship, the direction you're looking, and the whereabouts of enemy units. 右上のがコンパスだ。 君の船の方向、見ている方向、敵の位置を表示する。 Bottom right is your helm. It displays the settings of your engine and rudder, and your current speed. 右下が君の操舵装置だ。 エンジン、ラダーの設定と現在の速度を表示する。 Center screen is the crosshair. 画面中央にあるのが照準だ。 It shows you where you're aiming. どこを狙っているのかを示している。 Bottom left is the unit window. 左下がユニット・ウィンドウだ。 It displays the information about the current units. 現在の部隊の情報を表示している。 The initials to the right of the name tell you the class. 名前の右にある頭文字は分類を示している。 DD stands for destroyer. DDとは駆逐艦のことだ。 You learn the instrument details in later tutorials. 計器の詳細は後のチュートリアルで学ぶ。 Move the speed stick UP or DOWN to change the engine setting. スピード・スティックを上げ下げしてエンジンの設定を変えろ。 The settings are full speed, half, and back. 設定は『全速』、『半速』そして『後退』だ。 "To change speed, move the left stick UP/DOWN" 『スピードを変えるには左スティックを上下に動かしてください』 Move the speed stick LEFT or RIGHT to set your rudder. スピード・スティックを左右に動かして舵を設定しろ。 Once set, your ship continues turning until you steer straight. 一度、設定されると、船はまっすぐに戻すまで旋回を続ける。 The yellow arrow points out the objective. 黄色の矢印は目標を示している。 Head for the yellow arrow. 黄色の矢印に向かえ。 "To change the rudder setting move the left stick LEFT/RIGHT" 『ラダーの設定を変えるには左スティックを左右に動かしてください』 Now, that was a splendid piece of navigation, sailor.
さて、申し分のないみごとな航行だ、水兵。 While you are controlling the ship, you can look around in any direction. Try looking around. 周りを見まわしてみろ。 "To move the camera around the ship use the right stick" 『カメラを動かして周りを見るには右スティックを使用します』 Looks like you got it. つかんだようだな。 The indicator on the compass shows you the direction you're looking compared to the direction you're heading. コンパスの表示には進行方向と比較した見ている方向が表示される。 You also have binoculars for taking a closer look at things. また、君はより拡大してみるために双眼鏡を持っている。 Use the binoculars to zoom in on a carrier. 双眼鏡を使用して空母を拡大してみろ。 "To activate/deactivate the binoculars, press Y. Move the left stick UP/DOWN to zoom in/out." 『双眼鏡を使用するにはYを押してください。左スティックの上下で拡大/縮小します』 Excellent work, sailor. みごとだ、水兵。 "Training Course Completed" 『訓練完了』
https://w.atwiki.jp/hmiku/pages/36358.html
【Registered tags: 456CD CD CDG ナポリPCD 全国配信】
Previous work: Inner Exodus / This work: GETTING smaller / Next work: Cupid Power
Artist: ナポリP / Distribution: event sales, digital distribution / Release date: April 26, 2015 / Price: ¥1,000 / ¥1,000 / Circle: -

CD introduction
The 3rd album by ナポリP, a.k.a. miyake (MI8k). Artwork by 456. Distributed on April 26, 2015 at THE VOC@LOiD 超 M@STER 31 (超ボーマス31) during Niconico Chokaigi 2015. It is also sold as a download on BOOTH.

Track list
Recipe nothing / Life of thieves / トラベリングプレデター / プライマルスケルツォ / スレンダーマンウィドワーズ / 不完全な処遇 / 21世紀忘却のキュリオシティ(x) / ラプチャーステップ / The Fallen / ハッピーエンドを探しに

Links: BOOTH
https://w.atwiki.jp/flstudio2/pages/184.html
InstaComposer
An introduction to InstaComposer 2, an AI composition tool that generates phrases at the press of a button.

Contents: License activation / Presets (selecting presets, automatic phrase generation, adding presets) / MIDI routing (routing to a single instrument, routing to multiple instruments, Patcher presets for multi-instrument routing) / Generating drum parts

License activation
Choose "About/Authorize" from the Menu and enter your product key to activate the license.

Presets
Selecting a preset: Click the top of the plugin window to choose a preset.
Automatic phrase generation: Select the parts you want to generate (click the circled numbers to select them), then press the "GO" button to generate phrases.
Adding presets: To add purchased presets, choose "Menu > Open Preset Folder". Place the purchased preset folder (the folder containing the .txt files) in the Presets folder. The presets then become selectable; if they do not appear, choose "Refresh List".

MIDI routing
Routing to a single instrument: Load InstaComposer in the Channel Rack and, from the gear icon, set "MIDI Output port" to "1". For stock plugins, configure this in Patcher; if you are using an external plugin as the instrument, set its "MIDI Input port" to "1" from the gear icon to complete the routing.
Routing to multiple instruments: To route to multiple instruments, connect "VFX Color Mapper" in Patcher as shown below. Don't forget InstaComposer's MIDI setting as well (set "MIDI Output port" to "1"). Next, right-click VFX Color Mapper and, under "Outputs > Events", check "Voice output" 1-5. The Voice outputs are ordered roughly "Melody", "Pluck", "Bass", "Pad", "Chord" from the top, so connecting them accordingly completes the routing (the roles vary slightly depending on the MIDI generation preset).
Patcher presets for multi-instrument routing: For reference, Patcher presets with the MIDI routing already configured are attached: InstaComposer2.fst, InstaComposer2_Trance.fst

Generating drum parts
The presets do not include drum patterns, but you can add them manually. Click "Mode: Drums Auto" on "Track 6" and change it to "Dance". Then click ⑥ to light it up and press the "GO" button to generate a drum pattern. (If you don't want a four-on-the-floor pattern, "Default" also works.) Then set "Voice output 6" on VFX Color Mapper and connect FPC or a similar drum instrument. For the FPC sound source, kits such as "Electronic 909" are easy to use.
https://w.atwiki.jp/proko_translation/pages/25.html
Deliberate Practice The Secret of Getting Good Fast 計画的訓練 より早くうまくなる秘密 !-- 訳注 Deliberate Practiceは Deliberate Practiceで検索して出てきたものと 関係があるのか? http //lifehacking.jp/2010/01/deliberate-practice/ https //www.youtube.com/watch?v=wWuaQ84kGwI http //www.proko.com/deliberate-practice-the-secret-of-getting-good-fast/ After finishing the 2 demos in my last video, I realized that I made a big mistake in the drawings. 前回のビデオにおいて、二つの実演を行ったわけですが、 終わった後、私はドローイングにおいて大きな間違いを犯したことに 気づきました。 It’s the same mistake in both of them! 二つの実演両方で同じ間違いを犯してしまいました! I made the gesture too stiff. とても固くて動きのないジェスチャーを描いてしまいました。 Both of the reference images are very dynamic, but my drawings are not. 資料の画像は両方とも大変躍動的なのですが、 私のドローイングはそうではありません。 So, I thought this would be a great opportunity to address an important topic. それで、 私は、今回の間違いは 大変重要な項目を提供する大変良い機会だと 考えたわけです。 How to practice correctly and what to do when you make mistakes. どうやって正確に練習するか、 そして 間違った時に何をするのか ということです。 In fact making mistakes is a big part of practicing correctly. 実際、 間違うということは、 正しく練習することの 重要な部分を占めています。 When you make a mistake, (not if, but when you make a mistake) like I did with these drawings, analyze the mistake. あなたが間違えたとき、 (たとえそうでないとしても) 私がこれらのドローイングで犯したような間違いですが、 間違いを分析しましょう。 Don’t ignore it. このことを無視しないでください。 Figure out exactly what you did wrong and imagine in your mind what it would look like if you did it right. 何を間違ったのか見つけ出し、 もし正しく描いたとしたらどのように見えるのか 想像してみてください。 Then redraw it! そのあと、描き直してください! Draw the same thing, but this time, don’t make that same mistake. 同じものを描くのです。 しかし今回は、 同じ間違いを犯さないでください。 If you end up making the same mistake, then repeat the process. もし同じ間違いを犯す結果になってしまうのなら、 もう一度過程を繰り返しましょう。 Analyze your mistake, and try again. 間違いを分析し、 そして再び描きます。 Do it until you get it right. 良くなるまで繰り返してください。 That’s how you improve - not by practicing the same mistakes over and over again, but by fixing your mistakes. 
これが、あなたを改善させる方法です。 同じ間違いを何度も繰り返す練習をするのではなく、 あなたの間違いを改善するのです。 It’s better to do 1 drawing over and over, fixing your mistakes, than doing 10 different drawings and ignoring the mistakes. 一つのドローイングを、 間違いを直しながら 何回も繰り返すのがよりよいでしょう。 10の異なるドローイングを、ミスを無視しながら 描くよりもです。 So, I’m going to take my own advice and redo these drawings correctly. ですので、私自らそのアドバイスを受け入れ、 試しにこれらのドローイングを修正してみましょう。 Example 1 例1 Let’s starts with the first one. 最初の一枚から始めましょう。 Very extreme extension in the lumbar section. 腰椎部分に大変極端な伸展があります。 The pelvis is tilted forward, so we’re seeing a lot of the top plane of the cylinder. 骨盤は前方へ傾いているので、 骨盤を簡易化した円筒のてっぺんの面がより多く見えます。 Sides, showing the tilt. 側面は傾いて見えます。 And a bottom plane. そして底の面です。 Sometimes I like to add a centerline in there to show which way the front is pointing. 時々、 私はセンターラインを加えるのが好きです。 円筒形の平面がどっちを向いているのか示すためです。 And an angle between the ASIS landmarks. 上前腸骨棘の間の角度も考えます。 ASIS =anterior superior iliac spine 上前腸骨棘 Ok, so the relationship of the pelvis and rib cage is really important in this one, so before I do the spine, I want to find some gesture lines and a rough shape for the rib cage. はい、 骨盤と胸郭の関係は今回のドローイングでは大変重要ですので、 脊椎を描く前に、 ジェスチャーラインを見つけ、 胸郭のラフな形を描いておきたいと思います。 This will help me with placement of the spine. これは脊椎を置くとき大変助けになります。 Now Ill add the lumbar section with the extreme curvature pointing back. 次に、 後方に向かって大きく湾曲した 腰椎部分を描き加えます。 Add the thoracic section. 胸椎部分を加えます。 And that continues into the cervical section. そして胸椎部分は頸椎部分へつながっています。 That hole in the top of the rib cage isn’t visible from this angle since we’re looking up at the rib cage. 胸郭のてっぺんのこの穴はこのアングルからは見えません。 我々は今は胸郭を見上げているからです。 So this cylinder is going to be really flat. よって、この円筒形は大変平らになっています。 Angle of the sternum. 胸骨の角度。 Angle of the thoracic arch. 胸郭のアーチの角度。 And then follow that bottom rib around the back of the spine to the other side. 脊椎の周りをまわりながら反対側へ向かっている 肋骨の一番下を追いましょう。 And complete the thoracic arch. 
そして胸郭のアーチを終えましょう。 Forgot about some cross contour lines at the section divisions to help show which way the spine cylinder is pointing. 脊椎の円筒形がどっちの方向を向いているかを 見えやすくするために、 いくつかの交わっている等高線は無視しましょう。 Finally the skull. 最後に、頭蓋骨です。 It’s almost side view, but we’re looking up at it, so we will see a little bit of the bottom of the jaw… Side plane… Divide the front plane into thirds… And side plane of the jaw. 殆ど真横に見えます。 しかし、我々は今見上げていますので、 あごの底が少し見え、 側面、 前面を三つに分け、 最後は顎の側面です。 Now that looks much better than my first attempt. さて、最初のドローイングより良くなったと思います。 Definitely more dynamic gesture. 明らかにより躍動的なジェスチャーになりました。 Example 2 例2 My first attempt at this one was also too stiff. So, let’s try it again and make it dynamic. 私の最初のドローイングは、またもや固くて動きに欠けたものでした。 ですので、 もう一回挑戦し、より躍動的なものにしましょう。 The pelvis is tilted and the rib cage really thrusts forward. 骨盤は傾き、 胸郭は明らかに前に突き出ています。 That looks pretty good. 良くなりました。 Now let’s add the lumbar section of the spine. 腰椎部分を加えましょう。 Starts out pointing forward and then curves up to get that back extension. 最初の方では前を向き、 そして後方へ伸展するためにカーブしています。 Notice how I’m showing the top and bottom caps of the cylinder. 私がどのように円筒形の蓋を見せているかに 注意を払ってください。 This forces me to think about it as a 3d form rather than just curvy lines. 蓋の形をちゃんと描くためには、 単なる曲線ではなく 三次元の形を考えなければなりません。 Finding the relationship between the pelvis and rib cage. 骨盤と胸郭の間の関係を見つけてください。 A really important angle in this pose is the left edge from the pelvis to the ribcage. このポーズの大変重要な角度は、 骨盤から胸郭へ向かう、画面に向かって左側の輪郭です。 See how the muscles stretch tight there? どれぐらい筋肉がピンと伸ばされているか見えますか? This angle is really important to get right, otherwise I won’t capture that dynamic gesture. この角度を把握することは大変重要です。 さもなければこの躍動的なジェスチャーをつかみ損ねるでしょう。 Thoracic section leaning back. 胸椎部分は後ろへ傾いています。 Imagining the bottom rib swinging around the spine. 脊椎の周りに肋骨の底が回り込んでいることを しっかり想定してください。 And then add the thoracic arch to the bottom of the sternum… そして、胸郭のアーチを胸骨の底へ加えます。 And completing the shape of the rib cage. 
そして胸郭の形は完成です。 The cervical section leans to the right. 頸椎部分は右へ傾いています。 Start the construction of the head using the Loomis method. ルーミスメソッドを使いながら頭を描き始めます。 When doing this, I like to compare the cranium to the sternum. このことをやっている時、 私は頭蓋(cranium、顎の骨を除いた部分(たぶん))を胸骨と 比較するのが好きです。 They should be about the same size. それらは大体同じサイズのはずです。 Remember to consider foreshortening. 短縮遠近法のことは考慮に入れておいてください。 If the heads is leaning toward the camera, the cranium might be bigger than the sternum. もし頭がカメラに向かって傾いているのなら、 頭蓋は胸骨より大きく見えるでしょう。 And if it’s leaning away, the cranium might be smaller than the sternum. そして、もし頭が後ろへ傾いているなら、 頭蓋は胸骨より小さく見えるでしょう。 Extreme up tilt on this one. このモデルは極端に上方へ傾いています。 So, the brow ridge will be all the way up here. よって、眉山は、この辺りまで上がるでしょう。 nose here. 鼻はここ。 and chin here. そしてあご先はここです。 Then I can connect the jaw shape to the chin. その後、あごの形を顎先へつなげます。 Clean up some contour lines. 等高線を清書します。 And add a center line. そして中心線を加えます。 And just for fun, let’s add the angle between the ASIS landmarks. そして余談ですが、 上前腸骨棘の間の角度を描き加えてみます。 In the next lesson on the pelvis, I’ll show you why this angle is so important. 次の骨盤の授業で、 この角度がなぜ重要かお見せしましょう。 Ok, I think that looks pretty good! はい、大変良くなったように見えます。 If you did the assignments last week, go back and see if there’s anything that you can improve. もしあなたが先週の課題をやったなら、 何か改善できることがないか見直して見ましょう。 Do them over and over again until they seem easy! 簡単に見えるまで何度もやり直してください! Critiques !--訳注 ここからビデオの音声と不一致-- For additional help, watch my critique session on the spine. 追加の教材として、 生徒が描いた脊椎の課題に対する批評を見てください。 I go over student submitted work and provide insights on how they can improve their assignment examples. 私は生徒が行った課題を確認し、 彼らの課題をどのように改善することができるかを 提供しています。 If you’re posting your drawings, use hashtag #proko and don’t forget to follow me on Facebook and Instagram. If you like this video, share it with your friends, and if you want to be updated about new videos subscribe to the Proko newsletter. Filed in Anatomy ? Videos
https://w.atwiki.jp/ff14act/pages/24.html
Overview
Import/Export is the screen for configuring input and output of ACT log files. It is mainly used to re-analyze past logs after restarting or refreshing ACT. If you only need your own data, the "History Database" is more convenient to use.
Usage